Outliers robustness in multivariate orthogonal regression

Author

  • Giuseppe Carlo Calafiore
Abstract

This paper deals with the problem of multivariate affine regression in the presence of outliers in the data. The method discussed is based on weighted orthogonal least squares. The weights associated with the data satisfy a suitable optimality criterion and are computed by a two-step algorithm requiring a RANSAC step and a gradient-based optimization step. Issues related to the breakdown point of the method are discussed, and examples of application on various real multidimensional data sets are reported in the paper.
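The paper's two-step weight-optimization algorithm is not reproduced here, but the following sketch illustrates the general idea under simple assumptions: a weighted orthogonal (total) least-squares hyperplane fit via SVD, a RANSAC loop over random minimal subsets to find a consensus set, and a refinement that re-fits with residual-based soft weights in place of the paper's gradient-based weight optimization. Function names such as fit_hyperplane_tls and ransac_orthogonal_fit are illustrative, not from the paper.

```python
import numpy as np

def fit_hyperplane_tls(X, w=None):
    """Weighted orthogonal least-squares (total least-squares) hyperplane fit.

    Returns a unit normal n and offset c such that n @ x = c minimizes the
    weighted sum of squared orthogonal distances of the rows of X."""
    w = np.ones(len(X)) if w is None else np.asarray(w, float)
    mu = np.average(X, axis=0, weights=w)        # weighted centroid lies on the plane
    Xc = (X - mu) * np.sqrt(w)[:, None]
    # Normal direction = right singular vector with the smallest singular value
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    n = Vt[-1]
    return n, n @ mu

def ransac_orthogonal_fit(X, n_iter=500, tol=0.1, rng=None):
    """RANSAC-style robust orthogonal regression: sample minimal subsets,
    fit by TLS, keep the hyperplane with the largest consensus set, then
    re-fit all points with soft (Gaussian) residual weights."""
    rng = np.random.default_rng(rng)
    m, d = X.shape
    best_inliers = None
    for _ in range(n_iter):
        idx = rng.choice(m, size=d, replace=False)   # d points define a hyperplane
        n, c = fit_hyperplane_tls(X[idx])
        resid = np.abs(X @ n - c)
        inliers = resid < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Refinement: weighted TLS with weights decaying in the orthogonal residual
    n, c = fit_hyperplane_tls(X[best_inliers])
    resid = np.abs(X @ n - c)
    w = np.exp(-(resid / tol) ** 2)
    return fit_hyperplane_tls(X, w)
```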


Related articles

Robustness of reweighted Least Squares Kernel Based Regression

Kernel Based Regression (KBR) minimizes a convex risk over a possibly infinite dimensional reproducing kernel Hilbert space. Recently it was shown that KBR with a least squares loss function may have some undesirable properties from a robustness point of view: even very small amounts of outliers can dramatically affect the estimates. KBR with other loss functions is more robust, but often gives...
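As a rough illustration of reweighting for least-squares KBR (not the specific estimator analyzed in the cited work), the sketch below runs kernel ridge regression inside an IRLS loop with Huber-type weights, so that points with large residuals are progressively downweighted; rbf_kernel, reweighted_kernel_ridge, and all parameter values are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian RBF kernel matrix between the rows of X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def reweighted_kernel_ridge(X, y, lam=1e-2, gamma=1.0, c=1.345, n_iter=10):
    """Kernel ridge regression with an IRLS outer loop: residuals get
    Huber-type weights, so gross outliers lose influence on the fit."""
    K = rbf_kernel(X, X, gamma)
    n = len(y)
    w = np.ones(n)
    for _ in range(n_iter):
        W = np.diag(w)
        # Weighted kernel ridge solve: (W K + n*lam*I) alpha = W y
        alpha = np.linalg.solve(W @ K + n * lam * np.eye(n), W @ y)
        r = y - K @ alpha
        s = np.median(np.abs(r)) / 0.6745 + 1e-12        # robust scale (MAD)
        w = np.minimum(1.0, c * s / (np.abs(r) + 1e-12))  # Huber weights
    return alpha  # predict at x via sum_i alpha_i * k(x, x_i)
```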


Orthogonal linear regression algorithm based on augmented matrix formulation

Scope and Purpose: In this paper, a new technique for solving a multivariate linear model using orthogonal least absolute values regression is proposed. The orthogonal least absolute values (ORLAV) regression minimises the sum of the absolute orthogonal distances from each data point to the resulting regression hyperplane. In a large set of equations where the variables are independent of ...
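The augmented-matrix formulation itself is not shown here; as a rough illustration of the ORLAV criterion, the sketch below minimizes the sum of absolute orthogonal distances with a general-purpose optimizer, starting from the ordinary total-least-squares solution (orlav_fit is an illustrative name, not the cited algorithm).

```python
import numpy as np
from scipy.optimize import minimize

def orlav_fit(X):
    """Fit a hyperplane n @ x = c minimizing the sum of *absolute* orthogonal
    distances (L1 orthogonal regression), via a generic optimizer rather
    than the augmented-matrix formulation of the cited paper."""
    # Start from the ordinary total-least-squares solution
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu)
    theta0 = np.concatenate([Vt[-1], [Vt[-1] @ mu]])

    def objective(theta):
        n, c = theta[:-1], theta[-1]
        norm = np.linalg.norm(n) + 1e-12
        return np.abs(X @ n - c).sum() / norm   # sum of orthogonal L1 distances

    res = minimize(objective, theta0, method="Nelder-Mead")
    n, c = res.x[:-1], res.x[-1]
    norm = np.linalg.norm(n)
    return n / norm, c / norm
```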


Robust Estimation in Linear Regression Model: the Density Power Divergence Approach

The minimum density power divergence method provides robust estimates when the dataset contains outliers. In this study, we introduce and use a robust minimum density power divergence estimator to estimate the parameters of the linear regression model, and then, with some numerical examples of the linear regression model, we show the robustness of this est...
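A minimal sketch of a minimum density power divergence fit for linear regression with Gaussian errors is given below; the objective follows the standard DPD form from the literature, with tuning parameter alpha, and is minimized numerically. The name mdpde_linreg and the choice alpha=0.5 are illustrative assumptions, not taken from the cited paper.

```python
import numpy as np
from scipy.optimize import minimize

def mdpde_linreg(X, y, alpha=0.5):
    """Minimum density power divergence estimate for y = X @ beta + e,
    e ~ N(0, sigma^2). alpha > 0 trades efficiency for robustness
    (alpha -> 0 recovers maximum likelihood / least squares)."""
    def objective(theta):
        beta, log_s = theta[:-1], theta[-1]
        s = np.exp(log_s)                        # keep sigma positive
        r = y - X @ beta
        f = np.exp(-0.5 * (r / s) ** 2) / (np.sqrt(2 * np.pi) * s)  # normal density
        # DPD objective for a Gaussian model (standard form from the DPD literature)
        return (1.0 / (s**alpha * (2 * np.pi)**(alpha / 2) * np.sqrt(1 + alpha))
                - (1 + 1 / alpha) * np.mean(f**alpha))

    # Initialize at ordinary least squares
    beta0, *_ = np.linalg.lstsq(X, y, rcond=None)
    r0 = y - X @ beta0
    theta0 = np.concatenate([beta0, [np.log(r0.std() + 1e-6)]])
    res = minimize(objective, theta0, method="Nelder-Mead")
    return res.x[:-1], np.exp(res.x[-1])
```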


A Forward Regression Algorithm based on M-estimators

This paper introduces an orthogonal forward regression (OFR) model structure selection algorithm based on M-estimators. The basic idea of the proposed approach is to incorporate an IRLS inner loop into the modified Gram-Schmidt procedure. In this manner the OFR algorithm is extended to bad data conditions, with improved performance due to M-estimators' inherent robustness to outliers. An illus...
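The full OFR/modified Gram-Schmidt procedure is not reproduced here; the sketch below shows only the kind of IRLS inner loop with Huber M-estimator weights that such an algorithm would embed, for a fixed regressor set (irls_huber and the constants are illustrative).

```python
import numpy as np

def irls_huber(X, y, c=1.345, n_iter=20, tol=1e-8):
    """Iteratively reweighted least squares with Huber M-estimator weights.
    This is only an IRLS inner loop; the cited paper embeds such a loop
    inside a modified Gram-Schmidt orthogonal forward regression."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]      # least-squares start
    for _ in range(n_iter):
        r = y - X @ beta
        s = np.median(np.abs(r)) / 0.6745 + 1e-12    # robust scale (MAD)
        u = np.abs(r) / s
        w = np.where(u <= c, 1.0, c / u)             # Huber psi(u)/u weights
        Xw = X * w[:, None]
        beta_new = np.linalg.solve(X.T @ Xw, Xw.T @ y)  # weighted normal equations
        if np.linalg.norm(beta_new - beta) < tol:
            beta = beta_new
            break
        beta = beta_new
    return beta
```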


Identification of outliers types in multivariate time series using genetic algorithm

Multivariate time series data are often modeled using a vector autoregressive moving average (VARMA) model. However, the presence of outliers can violate the stationarity assumption and may lead to incorrect modeling, biased parameter estimation, and inaccurate prediction. Thus, the detection of these points, and how to deal with them properly, especially in relation to modeling and parameter estimation of VARMA m...
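The genetic-algorithm search is not reproduced here; as a simple point of reference, the sketch below fits a VAR(1) model by least squares and flags time points with unusually large residual Mahalanobis distances as candidate outliers (flag_var1_outliers and the threshold rule are illustrative assumptions, not the cited method).

```python
import numpy as np

def flag_var1_outliers(Y, thresh=3.0):
    """Fit a VAR(1) model Y_t = c + A Y_{t-1} + e_t by least squares and
    flag time points whose residual Mahalanobis distance is large.
    A simple residual screen, not the genetic-algorithm search of the paper."""
    Y = np.asarray(Y, float)
    T, k = Y.shape
    Z = np.hstack([np.ones((T - 1, 1)), Y[:-1]])        # regressors [1, Y_{t-1}]
    B = np.linalg.lstsq(Z, Y[1:], rcond=None)[0]        # rows: intercept, A.T
    E = Y[1:] - Z @ B                                   # one-step residuals e_t
    S = np.cov(E, rowvar=False) + 1e-9 * np.eye(k)
    d2 = np.einsum("ij,jk,ik->i", E, np.linalg.inv(S), E)  # squared Mahalanobis
    # Rough rule of thumb: flag residuals beyond thresh^2 * k
    return np.where(d2 > thresh**2 * k)[0] + 1          # time indices into Y
```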



Journal title:
  • IEEE Trans. Systems, Man, and Cybernetics, Part A

Volume 30  Issue

Pages  -

Publication date 2000